These prints show faithful dot reproduction and accurate overprint registration, with no streaking and no ghosting.
Rather than using the dot product equation, let's use the electric flux equation without the dot product.
Electric flux equals the integral of the dot product of the electric field and dA.
However, let's use E dA cosine theta instead of the dot product.
Electric flux equals the dot product of the electric field and the area, so let's use that.
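Written out, the flux relations quoted in the example sentences above are the standard ones (this worked form is added for reference and is not part of the source material):

\[
\Phi_E = \int \vec{E} \cdot d\vec{A} = \int E \, dA \cos\theta ,
\]

and for a uniform field over a flat surface this reduces to \(\Phi_E = \vec{E} \cdot \vec{A} = E A \cos\theta\).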
Solving a linear system with an orthonormal matrix is actually super easy, because dot products are preserved.
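A minimal sketch of why this is easy, assuming an orthonormal matrix Q and made-up vectors b, u, v (my own NumPy example, not from the source): since \(Q^\mathsf{T} Q = I\), the system \(Qx = b\) is solved by \(x = Q^\mathsf{T} b\), and dot products are preserved because \((Qu) \cdot (Qv) = u \cdot v\).

```python
import numpy as np

# Hypothetical example: a 2x2 rotation matrix is orthonormal
# (its columns are unit length and mutually perpendicular).
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Solving Q x = b needs no elimination: Q^T Q = I, so x = Q^T b.
b = np.array([3.0, -1.0])
x = Q.T @ b
print(np.allclose(Q @ x, b))          # True

# Dot products are preserved: (Qu) . (Qv) equals u . v for any u, v.
u = np.array([1.0, 2.0])
v = np.array([-2.0, 0.5])
print(np.isclose(np.dot(Q @ u, Q @ v), np.dot(u, v)))  # True
```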
That is, they don't preserve that zero dot product.
The dot product before and after the transformation will look very different.
And looking at the example I just showed, dot products certainly aren't preserved.
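A quick numeric check of this claim (an assumed example, not taken from the source video): a shear matrix sends the two perpendicular basis vectors to vectors whose dot product is no longer zero.

```python
import numpy as np

# Hypothetical example: a shear keeps i-hat at (1, 0) but moves j-hat to (1, 1).
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

u = np.array([1.0, 0.0])   # the two basis vectors: perpendicular,
v = np.array([0.0, 1.0])   # dot product zero before the transformation

print(np.dot(u, v))          # 0.0
print(np.dot(S @ u, S @ v))  # 1.0, so the zero dot product is not preserved
```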
The relevant background here is understanding determinants, a little bit about dot products, and of course, linear systems of equations.
In the sense that applying the transformation is the same thing as taking a dot product with that vector.
In fact, a worthwhile side note here: transformations which do preserve dot products are special enough to have their own name.
And if they point in generally the opposite direction, their dot product is negative.
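For instance (my own arithmetic, not from the source): vectors pointing in roughly opposite directions, such as \((1, 0)\) and \((-1, 0.2)\), give

\[
(1, 0) \cdot (-1, 0.2) = 1 \cdot (-1) + 0 \cdot 0.2 = -1 < 0 .
\]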
The x-coordinate of this mystery input vector is what you get by taking its dot product with the first basis vector, (1, 0).
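Written out (a standard computation, added for reference):

\[
(x, y) \cdot (1, 0) = x \cdot 1 + y \cdot 0 = x ,
\]

so dotting with the first basis vector simply reads off the x-coordinate.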
Likewise, things that start off perpendicular, with dot product zero, like the two basis vectors, quite often don't stay perpendicular to each other after the transformation.
When they're perpendicular, meaning the projection of one onto the other is the zero vector, their dot product is zero.
And that's probably the most important thing for you to remember about the dot product.
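The geometric formula behind this statement (the standard definition, added for reference):

\[
\vec{u} \cdot \vec{v} = \lVert \vec{u} \rVert \, \lVert \vec{v} \rVert \cos\theta ,
\]

which is zero exactly when \(\theta = 90^\circ\), i.e. when the projection of one vector onto the other is the zero vector.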
Looking at sides 2 and 4, we need to realize that the dot product of B and ds is the same as B ds cosine theta, right?
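Spelled out (a standard identity; which sides are labeled 2 and 4 depends on the lecturer's Ampèrian loop, so that labeling is an assumption here):

\[
\vec{B} \cdot d\vec{s} = B \, ds \cos\theta ,
\]

so along any side where \(\vec{B}\) is perpendicular to \(d\vec{s}\) (\(\theta = 90^\circ\)), the contribution to \(\oint \vec{B} \cdot d\vec{s}\) is zero.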
So that wraps up dot products and cross products.
So that performing the linear transformation is the same as taking a dot product with that vector, the cross product.
Now, this numerical operation of multiplying a one-by-two matrix by a vector feels just like taking the dot product of two vectors.
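Concretely (a standard computation, added for reference):

\[
\begin{bmatrix} a & b \end{bmatrix}
\begin{bmatrix} x \\ y \end{bmatrix}
= a x + b y
= \begin{bmatrix} a \\ b \end{bmatrix} \cdot \begin{bmatrix} x \\ y \end{bmatrix} ,
\]

so applying the 1×2 matrix (a linear map from 2D to the number line) is the same as taking the dot product with the vector \((a, b)\), which is the duality the earlier sentences are describing.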